Nonlinear Rescaling as Interior Quadratic Prox Method in Convex Optimization

Author

  • Roman A. Polyak
Abstract

A class Ψ of strictly concave and twice continuously differentiable functions ψ : R → R with particular properties is used for constraint transformation in the framework of a Nonlinear Rescaling (NR) method with "dynamic" scaling parameter updates. We show that the NR method is equivalent to the Interior Quadratic Prox method for the dual problem in a rescaled dual space. The equivalence is used to prove convergence and to estimate the rate of convergence of the NR method and its dual equivalent under very mild assumptions on the input data for a wide class Ψ of constraint transformations. It is also used to estimate the rate of convergence under strict complementarity and under the standard second-order optimality condition. We prove that for any ψ ∈ Ψ that corresponds to a well-defined dual kernel φ = −ψ∗, the NR method applied to LP generates a quadratically convergent dual sequence if the dual LP has a unique solution.
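To make the scheme concrete, here is a minimal sketch of the NR iteration pattern the abstract describes, under illustrative assumptions: the transformation is taken to be ψ(t) = log(1 + t) (which satisfies ψ(0) = 0, ψ′(0) = 1 and is strictly concave), and the test problem (min x² s.t. x ≥ 1) with all function names is invented for this example, not taken from the paper.

```python
# Hypothetical sketch of a Nonlinear Rescaling (NR) iteration with
# psi(t) = log(1 + t): the constraint c(x) >= 0 is replaced by the
# equivalent (1/k) * psi(k * c(x)) >= 0, the NR Lagrangian
#     L(x) = f(x) - (lam / k) * psi(k * c(x))
# is minimized in x, and the multiplier is updated by
#     lam <- lam * psi'(k * c(x)) = lam / (1 + k * c(x)).

def nr_solve(f_grad, f_hess, c, c_grad, lam=1.0, k=10.0, x=2.0, iters=20):
    """Alternate unconstrained minimization of the NR Lagrangian
    (safeguarded 1-D Newton) with the multiplier update."""
    for _ in range(iters):
        # Inner loop: Newton on the smooth, strictly convex NR Lagrangian.
        for _ in range(50):
            t = 1.0 + k * c(x)               # argument of psi', kept > 0
            g = f_grad(x) - lam * c_grad(x) / t
            if abs(g) < 1e-12:
                break
            h = f_hess(x) + lam * k * c_grad(x) ** 2 / t ** 2
            step = g / h
            # Damp the step so the iterate stays in the interior
            # of the domain of psi(k * c(x)).
            while 1.0 + k * c(x - step) <= 0.0:
                step *= 0.5
            x -= step
        # "Dynamic" Lagrange multiplier update.
        lam = lam / (1.0 + k * c(x))
    return x, lam

# Illustrative problem: min x^2 subject to c(x) = x - 1 >= 0.
# The KKT conditions give x* = 1 and lam* = 2.
x_opt, lam_opt = nr_solve(
    f_grad=lambda x: 2.0 * x,
    f_hess=lambda x: 2.0,
    c=lambda x: x - 1.0,
    c_grad=lambda x: 1.0,
)
print(x_opt, lam_opt)
```

On this toy problem the dual sequence lam converges rapidly to the optimal multiplier, consistent with the linear (and, for LP with a unique dual solution, quadratic) rates discussed in the abstract.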


Similar articles

Nonlinear rescaling vs. smoothing technique in convex optimization

We introduce an alternative to the smoothing technique approach for constrained optimization. As it turns out, for any given smoothing function there exists a modification with particular properties. We use the modification for Nonlinear Rescaling (NR) of the constraints of a given constrained optimization problem into an equivalent set of constraints. The constraints transformation is scaled by a ...


Lagrangian Transformation and Interior Ellipsoid Methods in Convex Optimization

The rediscovery of the affine scaling method in the late 80s was one of the turning points which led to a new chapter in modern optimization: Interior Point Methods (IPMs). The purpose of this paper is to show the intrinsic connections between Interior and Exterior Point Methods (EPMs), which have been developed during the last 30 years. A class Ψ of smooth and strictly concave functions ψ :...


Proximal Point Nonlinear Rescaling Method for Convex Optimization

Nonlinear rescaling (NR) methods alternate finding an unconstrained minimizer of the Lagrangian for the equivalent problem in the primal space (which is an infinite procedure) with a Lagrange multiplier update. We introduce and study a proximal point nonlinear rescaling (PPNR) method that preserves convergence and retains the linear convergence rate of the original NR method, and at the same time d...


A Method for Solving Convex Quadratic Programming Problems Based on Differential-Algebraic Equations

In this paper, a new model based on differential-algebraic equations (DAEs) for solving convex quadratic programming (CQP) problems is proposed. It is proved that the new approach is guaranteed to generate optimal solutions for this class of optimization problems. This paper also shows that the conventional interior point methods for solving CQP problems can be viewed as a special case of the n...


An Interior Point Algorithm for Solving Convex Quadratic Semidefinite Optimization Problems Using a New Kernel Function

In this paper, we consider convex quadratic semidefinite optimization problems and provide a primal-dual Interior Point Method (IPM) based on a new kernel function with a trigonometric barrier term. The iteration complexity of the algorithm is analyzed under some easy-to-check and mild conditions. Although our proposed kernel function is neither a Self-Regular (SR) fun...



Journal:
  • Comp. Opt. and Appl.

Volume 35, Issue -

Pages -

Publication date: 2006